Confidence-Aware Paced-Curriculum Learning by Label Smoothing for Surgical Scene Understanding
Authors
Abstract
Curriculum learning and self-paced learning are training strategies that gradually feed samples from easy to more complex. They have captivated increasing attention due to their excellent performance in robotic vision. Most recent works focus on designing curricula based on the difficulty levels of the input samples or on smoothing the feature maps. However, smoothing the labels to control the utility of samples in a curriculum manner is still unexplored. In this work, we design a paced curriculum by label smoothing (P-CBLS) using uniform label smoothing (ULS) for classification tasks and fuse uniform and spatially varying label smoothing (SVLS) for semantic segmentation tasks in a curriculum manner. In ULS and SVLS, a bigger smoothing factor value enforces a heavy penalty on the true label and limits the model to learning less information. Therefore, we design a curriculum by label smoothing (CBLS): we set a bigger smoothing value at the beginning of training and gradually decrease it to zero to raise the model's learning utility from lower to higher. We also design a confidence-aware pacing function and combine it with our CBLS to investigate the benefits of various curricula. The proposed techniques are validated on four robotic surgery datasets covering multi-class classification, multi-label classification, captioning, and segmentation tasks. We also assess the robustness of our method by corrupting the validation data into different severity levels. Our extensive analysis shows that the proposed method improves prediction accuracy and robustness. The code is publicly available at https://github.com/XuMengyaAmy/P-CBLS. Note to Practitioners—The motivation of this article is to improve deep neural networks in safety-critical applications such as robotic surgery by controlling the model's learning ability, allowing it to imitate the cognitive process of humans and animals. The proposed approaches do not add extra parameters and do not require additional computational resources.
Similar Resources
Self-Paced Curriculum Learning
Curriculum learning (CL) or self-paced learning (SPL) represents a recently proposed learning regime inspired by the learning process of humans and animals that gradually proceeds from easy to more complex samples in training. The two methods share a similar conceptual learning paradigm, but differ in specific learning schemes. In CL, the curriculum is predetermined by prior knowledge, and rema...
ScreenerNet: Learning Self-Paced Curriculum for Deep Neural Networks
We propose to learn a curriculum or a syllabus for supervised learning with deep neural networks. Specifically, we learn weights for each sample in training by an attached neural network, called ScreenerNet, to the original network and jointly train them in an end-to-end fashion. We show the networks augmented with our ScreenerNet achieve early convergence with better accuracy than the state-of...
Deep learning for multi-label scene classification
Scene classification is an important topic in computer vision. For similar weather conditions, there are some obstacles for extracting features from outdoor images. In this thesis, I present a novel approach to classify cloudy and sunny weather images. Inspired by recent study of a deep convolutional neural network and the spatial pyramid matching, I generate a model based on the ImageNet datas...
A Self-Paced Regularization Framework for Multi-Label Learning
In this brief, we propose a novel multilabel learning framework, called multilabel self-paced learning, in an attempt to incorporate the SPL scheme into the regime of multilabel learning. Specifically, we first propose a new multilabel learning formulation by introducing a self-paced function as a regularizer, so as to simultaneously prioritize label learning tasks and instances in each iterati...
Robust Learning and Segmentation for Scene Understanding
This thesis demonstrates methods useful in learning to understand images from only a few examples, but they are by no means limited to this application. Boosting techniques are popular because they learn effective classification functions and identify the most relevant features at the same time. However, in general, they overfit and perform poorly on data sets that contain many features, but fe...
Journal
Journal title: IEEE Transactions on Automation Science and Engineering
Year: 2023
ISSN: 1545-5955, 1558-3783
DOI: https://doi.org/10.1109/tase.2023.3276361